
fix distributed loss #5380

Closed. epwalsh wants to merge 2 commits from the dist-loss-fix branch.

Conversation

@epwalsh (Member) commented Aug 27, 2021

No description provided.

@epwalsh requested a review from AkshitaB August 27, 2021 02:36
@AkshitaB (Contributor) commented:

This fails the tests because training_util.get_metrics is called for each batch. I've moved the reduce logic to gradient_descent_trainer in this PR.
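For readers unfamiliar with the mechanics under discussion: reducing a loss across distributed workers is typically done with a collective operation such as torch.distributed.all_reduce, which every worker must reach. The sketch below only illustrates that general pattern; it is not the code from this PR or #5381, it assumes torch.distributed has already been initialized by the training process, and the helper name reduce_loss is hypothetical.

```python
# A minimal sketch of averaging a scalar loss across distributed workers.
# Not the code from either PR; assumes torch.distributed is already set up,
# and the helper name `reduce_loss` is hypothetical.
import torch
import torch.distributed as dist


def reduce_loss(total_loss: float, device: torch.device) -> float:
    """Return the loss averaged over all workers, or unchanged if not distributed."""
    if dist.is_available() and dist.is_initialized():
        loss_tensor = torch.tensor(total_loss, device=device)
        # all_reduce is a collective call: every rank must reach it,
        # so where (and how often) it runs matters.
        dist.all_reduce(loss_tensor, op=dist.ReduceOp.SUM)
        return loss_tensor.item() / dist.get_world_size()
    return total_loss
```

Because a collective like all_reduce blocks until every worker participates, invoking it from a helper that runs once per batch behaves quite differently from invoking it at controlled points inside the trainer, which appears to be the distinction raised above.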

@epwalsh (Member, Author) commented Aug 27, 2021

Closing in favor of #5381

@epwalsh closed this Aug 27, 2021
@epwalsh deleted the dist-loss-fix branch August 27, 2021 15:45